TECHnalysis Research Blog

October 11, 2022
Google Unveils a Host of Open Data and AI Advancements at Cloud Next

By Bob O'Donnell

When it comes to technology-based products, concepts are often significantly more elegant than reality. Capabilities and functions that sound logical and straightforward often prove to be much more complicated or arduous than they first appear.

Part of the problem, of course, is that many of the most advanced technologies really are complex, and it can be very difficult to bring them to life. But an even more common problem is that pre-existing requirements aren’t fully explained, or the number of steps required proves much more daunting than it first appears. To put it simply, “the devil is in the details.”

This, of course, is true of many cloud and AI-focused technologies. High-level product ideas, such as the ability to quickly analyze any type of data to help generate artificial intelligence (AI) or machine learning (ML)-driven models using the new types of hardware accelerators, have been talked about for years. As Google made clear through several announcements at its Cloud Next event, however, there are a lot of important details that need to be in place in order for these ideas to become reality.

To start with, not all data analysis tools and data platforms can work with any type of data. That’s why the ability to import, or ingest, data in new and different formats into a wider range of analytics tools is so important. Opening up the ability for data platforms like Elastic to access data stored on Google Cloud, and Google bringing support for Elastic into its newly expanded Looker line of business analytics tools, for example, are just two of the many open data-related announcements made at Cloud Next.

Similarly, different types of data are often stored in different formats, and analytics tools have to specifically enable support for these data structures in order to make them more useful to a wider variety of users and application developers. In the growing field of data lakehouses, for example, where large “lakes” of unstructured data, such as video and audio, can be queried with the kinds of tools typically found in structured data warehouses, the open-source Apache Iceberg table format is becoming increasingly popular. That’s why Google added support for it and other formats, including Delta and Hudi, to its BigLake storage engine and added support for analyzing unstructured data to its BigQuery data analytics tools. Not only does this provide additional flexibility, but it also means unstructured data can leverage other Google Cloud Platform (GCP) BigQuery tools, including ML functions like speech recognition, computer vision, text processing and more.
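To make the lakehouse idea above concrete, here is a deliberately simplified Python sketch of what a table format accomplishes: it registers metadata about raw files sitting in a data lake so that warehouse-style SQL can be run over them. This is not how Apache Iceberg actually works internally (Iceberg tracks snapshots, manifest files, and schema evolution), and all file names and fields here are hypothetical; the sketch only illustrates the core concept of making unstructured files queryable.

```python
# Toy illustration of the lakehouse concept: a metadata "table format"
# layer that makes raw, unstructured files in a data lake queryable
# with ordinary SQL. NOT a real Iceberg implementation -- concept only.
import sqlite3

# Hypothetical metadata for unstructured files sitting in object storage
lake_files = [
    ("videos/intro.mp4",  "video", 104_857_600),
    ("audio/call_01.wav", "audio",   5_242_880),
    ("videos/demo.mp4",   "video",  52_428_800),
]

# The "table format": expose file metadata as a queryable table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE lake (path TEXT, media_type TEXT, bytes INTEGER)")
conn.executemany("INSERT INTO lake VALUES (?, ?, ?)", lake_files)

# Warehouse-style SQL over the lake's contents
rows = conn.execute(
    "SELECT media_type, COUNT(*), SUM(bytes) FROM lake "
    "GROUP BY media_type ORDER BY media_type"
).fetchall()
for media_type, n, total in rows:
    print(media_type, n, total)  # e.g. "video 2 157286400"
```

In a real lakehouse, the engine (BigQuery, in Google's case) would also push the actual analysis, such as running speech recognition over the audio files, down to the data rather than just counting bytes.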

Another important area of development has to do with the use of various types of hardware accelerator chips to improve AI model performance. Google itself has created several generations of TPUs (tensor processing units), for example, that offer important benefits for applications like AI model training or inference. In addition, there have been many recent announcements from established semiconductor companies like Intel, AMD, Nvidia, and Qualcomm as well as a slew of chip startups focused on this burgeoning area.

As you might expect, each of those chip companies uses different techniques to accelerate AI and ML models. What isn’t as widely understood, however, is that the methods necessary to write software or create models for the different accelerators are also proprietary. As a result, it can be very challenging for software developers and AI/ML model creators to take advantage of these different chips because of how difficult it can be to learn all these unique approaches.

In order to address this, two of Google’s more intriguing announcements from Cloud Next are the launch of a new industry consortium called the OpenXLA Project and the debut of new open-source software tools designed to ease the process of working with multiple different types of hardware accelerators. In particular, OpenXLA is designed to increase the flexibility of choices that AI/ML developers have by providing connections between most of the popular front-end frameworks used for building AI models—including TensorFlow, PyTorch and JAX—and a host of different hardware accelerator backends. The initial software tools being released include an upgraded XLA compiler and a portable set of ML computing operations called StableHLO.
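The framework-to-backend decoupling described above is easiest to see in JAX, one of the front-end frameworks the article names, which already compiles through XLA today. In the sketch below (a minimal illustration, not an OpenXLA-specific API), the same Python function is traced and compiled once, and XLA handles lowering it to whatever backend is present, so the developer never writes device-specific code:

```python
# Minimal sketch of XLA-style compilation via JAX: one function
# definition, compiled for whatever accelerator backend is available
# (CPU, GPU, or TPU), with no device-specific code from the developer.
import jax
import jax.numpy as jnp

@jax.jit  # trace the function and compile it through XLA
def scaled_sum(x, y):
    return jnp.sum(x * 2.0 + y)

x = jnp.arange(4.0)          # [0., 1., 2., 3.]
y = jnp.ones(4)
result = scaled_sum(x, y)    # runs the compiled XLA program
print(float(result))         # 16.0

# The backend was chosen automatically, e.g. "cpu", "gpu", or "tpu"
print(jax.devices()[0].platform)
```

OpenXLA’s stated aim is to extend this same pattern across frameworks and across vendors’ accelerators, with StableHLO as the portable operation set sitting between them.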

Companies that have joined Google in the initiative include Intel, Amazon Web Services, AMD, Nvidia, Arm, Meta and more. The inclusion of Intel is particularly interesting, because, in many ways, the goal of the OpenXLA Project is similar to Intel’s own OneAPI initiative, which is targeted at allowing developers to leverage Intel’s multiple computing architectures, such as CPUs, GPUs, and Habana Gaudi AI accelerators, without having to learn how to program for each of the different chip types. OpenXLA takes that concept to an industry-wide level and, thanks to the inclusion of many key cloud computing players, should open up a number of important new opportunities and speed along the adoption of hardware accelerators.

Like many of the announcements Google made at Cloud Next, the real-world benefits of the OpenXLA Project and the tools associated with it will take some time to make a significant impact. In the big picture of tech industry trends, these tools may seem a bit modest on their own. Collectively, however, they represent very important steps forward and are indicative of the kinds of efforts Google is making to make its tools more useful to a wider audience. They also reflect a strong emphasis on open-source tools and a desire to make its Google Cloud Platform and related offerings more transparent and more flexible. The process of leveraging all the technology tools that Google offers is still undoubtedly complex, but with the broad collection of announcements that the company unveiled at Cloud Next, it is clear that the company’s evolution as a major cloud provider continues to advance.

Here's a link to the original column: https://www.linkedin.com/pulse/google-unveils-host-open-data-ai-advancements-cloud-next-o-donnell

Bob O’Donnell is the president and chief analyst of TECHnalysis Research, LLC, a market research firm that provides strategic consulting and market research services to the technology industry and professional financial community. You can follow him on Twitter @bobodtech.
